Informational Entropies for Non-ergodic Brains
Authors

Arturo Tozzi 1, James F. Peters 2, Marzieh Zare 3*, Mehmet Niyazi Çankaya 4

1 Center for Nonlinear Science, University of North Texas, Denton, Texas 76203, USA; [email protected]
2 Department of Electrical and Computer Engineering, University of Manitoba, 75A Chancellor's Circle, Winnipeg, MB R3T 5V6, Canada; [email protected]
3 Institute for Research in Fundamental Sciences, Tehran 19538, Iran; [email protected]
4 Department of Statistics, Faculty of Arts and Science, Usak University, Usak, Turkey; [email protected]
* Correspondence: [email protected]

Version October 4, 2016, submitted to Entropy.

Abstract

Informational entropies, although proven useful in the evaluation of nervous function, are suitable only if we assume that nervous activity takes place under ergodic conditions. However, widespread claims suggest that the brain operates in a non-ergodic framework. Here we show that a topological concept, namely the Borsuk-Ulam theorem, is able to remove this long-standing limitation of both Shannon entropy and its generalizations, such as Rényi's. We demonstrate that both ergodic and non-ergodic informational entropies can be evaluated and quantified through topological methods, in order to improve our knowledge of central nervous system function.
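As a concrete reference point for the quantities named above, here is a minimal sketch, in Python, of Shannon entropy and the Rényi entropy of order alpha for a discrete distribution. It illustrates only the standard definitions, not the paper's topological (Borsuk-Ulam) method, and the probability vector is a made-up example.

    import numpy as np

    def shannon_entropy(p, base=2.0):
        # H(p) = -sum_i p_i log p_i, with zero-probability terms dropped.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)) / np.log(base))

    def renyi_entropy(p, alpha, base=2.0):
        # H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha > 0, alpha != 1.
        # As alpha -> 1 this recovers Shannon entropy.
        if np.isclose(alpha, 1.0):
            return shannon_entropy(p, base)
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(np.log(np.sum(p ** alpha)) / ((1.0 - alpha) * np.log(base)))

    p = [0.5, 0.25, 0.125, 0.125]        # hypothetical state probabilities
    print(shannon_entropy(p))            # 1.75 bits
    print(renyi_entropy(p, alpha=2.0))   # collision entropy, ~1.54 bits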
Similar papers
Non-linear ergodic theorems in complete non-positive curvature metric spaces
Hadamard (or complete $CAT(0)$) spaces are complete, non-positive-curvature metric spaces. Here we prove a nonlinear ergodic theorem for continuous non-expansive semigroups in these spaces, as well as a strong convergence theorem for the commutative case. Our results extend the standard non-linear ergodic theorems for non-expansive maps on real Hilbert spaces to non-expansive maps on Ha...
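As background (standard definitions, not results of the paper): the two conditions this abstract relies on can be written compactly, with $T$ a self-map of a metric space $(X, d)$ and $m$ the midpoint of the geodesic from $y$ to $z$.

    % Non-expansiveness of T on (X, d):
    \[ d(Tx, Ty) \le d(x, y) \qquad \text{for all } x, y \in X. \]
    % The CN inequality of Bruhat-Tits, which characterizes complete
    % CAT(0) (Hadamard) spaces among complete geodesic spaces:
    \[ d(x, m)^2 \le \tfrac{1}{2}\, d(x, y)^2 + \tfrac{1}{2}\, d(x, z)^2 - \tfrac{1}{4}\, d(y, z)^2. \]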
Large deviations for empirical entropies of g-measures (arXiv:math/0406083v3 [math.PR], 17 Jun 2005)
The entropy of an ergodic finite-alphabet process can be computed from a single typical sample path $x_1^n$ using the entropy of the k-block empirical probability and letting k grow with n roughly like $\log n$. We further assume that the distribution of the process is a g-measure. We prove large deviation principles for conditional, non-conditional and relative k(n)-block empirical entropies.
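The construction described here is straightforward to sketch. The following illustrates the k-block empirical entropy estimate on a binary sample path; the Bernoulli test source and the specific choice k = floor(log2 n) are assumptions made for the example, not the paper's exact setup.

    import math
    import random
    from collections import Counter

    def k_block_entropy_rate(x, k):
        # Empirical entropy (nats) of the k-block distribution of x,
        # divided by k to estimate the per-symbol entropy rate.
        blocks = [tuple(x[i:i + k]) for i in range(len(x) - k + 1)]
        total = len(blocks)
        counts = Counter(blocks)
        h_k = -sum((c / total) * math.log(c / total) for c in counts.values())
        return h_k / k

    random.seed(0)
    n = 100_000
    x = [1 if random.random() < 0.3 else 0 for _ in range(n)]  # Bernoulli(0.3) path
    k = max(1, int(math.log2(n)))  # k grows with n roughly like log n
    # True entropy rate: -0.3 ln 0.3 - 0.7 ln 0.7, about 0.611 nats/symbol.
    print(k_block_entropy_rate(x, k))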
Large deviations for empirical entropies of Gibbsian sources
The entropy of an ergodic finite-alphabet process can be computed from a single typical sample path $x_1^n$ using the entropy of the k-block empirical probability and letting k grow with n roughly like $\log n$. We further assume that the distribution of the process is a g-measure with respect to a continuous and regular g-function. We prove large deviation principles for conditional, non-conditional an...
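The g-measure assumption appearing in both of the preceding abstracts has a compact standard definition (Keane's notion, recalled here as background): a stationary measure $\mu$ on $A^{\mathbb{Z}}$ is a g-measure for a continuous, strictly positive $g$ with $\sum_{a \in A} g(\dots, x_{-1}, a) = 1$ when its one-symbol conditional probabilities are given by $g$.

    % Defining property of a g-measure:
    \[ \mu\left( x_0 \mid x_{-1}, x_{-2}, \dots \right) = g(\dots, x_{-2}, x_{-1}, x_0) \qquad \mu\text{-a.s.} \]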
متن کاملErgodic decomposition of excess entropy and conditional mutual information∗
The article discusses excess entropy, defined as the mutual information between the past and the future of a stationary process. The central result is an ergodic decomposition: excess entropy is the sum of the self-information of the shift-invariant σ-field and the average of the excess entropies of the ergodic components of the process. The result is derived using generalized conditional mutual information for fi...
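In symbols, with notation assumed here for illustration only: writing $(X_k)$ for the stationary process, $\mathcal{I}$ for the shift-invariant σ-field, $\pi$ for the mixing measure of the ergodic decomposition, and $E(\mu)$ for the excess entropy of an ergodic component $\mu$, the two statements of the abstract read schematically as follows.

    % Excess entropy as past-future mutual information:
    \[ E = I\left( X_{-\infty}^{0};\, X_{1}^{\infty} \right). \]
    % Ergodic decomposition stated in the abstract (schematic form):
    \[ E = H(\mathcal{I}) + \int E(\mu)\, d\pi(\mu). \]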
Groups, information theory, and Einstein's likelihood principle
We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a...
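The composability requirement this abstract refers to can be sketched as follows (notation assumed for the illustration): for independent systems $A$ and $B$, the entropy of the joint system should be a function $\Phi$ of the individual entropies, and associativity and commutativity of composition constrain $\Phi$ to behave like a group law; Shannon entropy is the additive case $\Phi(x, y) = x + y$.

    % Composability: entropy of a joint independent system depends
    % only on the component entropies, via an associative, commutative law.
    \[ S(A \times B) = \Phi\bigl( S(A), S(B) \bigr), \qquad
       \Phi(x, y) = \Phi(y, x), \qquad
       \Phi(\Phi(x, y), z) = \Phi(x, \Phi(y, z)). \]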